Source: hanfordsentinel.com 1/14/26
SACRAMENTO, Calif. — California announced an investigation into Elon Musk’s artificial intelligence company xAI on Wednesday, with Gov. Gavin Newsom saying that the social media site owned by the billionaire is a “breeding ground for predators to spread nonconsensual sexually explicit AI deepfakes.”
Grok, the xAI chatbot, includes image-generation features that allow users to morph existing photos into new images. The newly created images are then posted publicly on X.
In some cases, users have created sexually explicit or nonconsensual images based on real people, including altered depictions that appear to show individuals partially or fully undressed. Others have generated images that appear to show minors, prompting criticism that there are not sufficient guardrails to prohibit the creation of child pornography.
The social media site has previously said “we take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary. Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.”
Newsom called the sexualized images being created on the platform “vile.” Attorney General Rob Bonta said his office will use “all tools at our disposal to keep Californians safe.”
“The avalanche of reports detailing the nonconsensual, sexually explicit material that xAI has produced and posted online in recent weeks is shocking,” Bonta said in a …

For those who understand the law, this is what one would call a “slippery slope.” On the one hand, we want to protect our children, WHICH IS HOW IT SHOULD BE. On the other hand, where do we draw the line? Touching children? That’s fair, and that’s how it’s been for longer than I can remember. Using actual children to create pornography? Again, that’s fair, and that’s how it’s been for longer than I can remember. But now we are talking about banning images that were created with a computer. On the surface, that seems fair (although I would be interested in seeing studies on whether this reduces or increases the creation of child pornography involving physical children), but where does it stop?

If this is banned, why not non-fictional accounts written by the abusers themselves? What about fictional stories? Might as well ban those, too. But if we ban those, what about the accounts written by the victims, in which they describe their abuse in graphic detail in an attempt to share their experiences? And if we are doing all of this, we might as well go ahead and ban using AI to make porn of adults engaging in sex acts, which, of course, will no doubt lead to banning the use of AI to make videos of politicians saying and doing things that contradict their positions. Before you know it, you can’t use AI to make anything, because we’ve all fallen down that slippery slope.

Whether you realize it or not, our legislators use those convicted of sex offenses to test out new laws. Why? Because very few people will object to a law targeting such a person, and fewer still will be truly vocal about it. Once the law is passed, precedent has been created. We can’t exactly apply the law to one class of people and not another, now can we? That would be a violation of equal protection.

Take registration and ankle monitors, for instance. First, we were forcing those convicted of sex offenses to wear ankle monitors while on parole. Before you knew it, parole had the authority to strap a monitor on anyone they could fit into one of their special exception categories, i.e., arsonists, gangbangers, etc. And registration? Don’t even get me started. The state of Nevada has discovered a clever way to expand registration to include practically everyone in the state. How? Because gambling is legal there, and under the law, anyone working in a place that contains anything related to actual gambling, such as a slot machine, has to register with the gambling commission. My ex had to register with the state to work as an exotic dancer for the single day we spent in Reno while driving through.

As you read this, please understand that I am not, in any way, shape, or form, advocating that we should encourage the use of AI to create and distribute child pornography. To the contrary, I find the idea repugnant and distasteful in the extreme. The issue I am trying to present here is what happens once we start down that slope, because there is no way to stop our descent once it has started. The hill only gets steeper and slipperier. Once we start down that slope, there is absolutely no turning back.

So, what is the answer? Perhaps this is just one of those issues where there isn’t a solution. At the end of the day, people are going to do what people are going to do. Knowing this, which would you rather have: that they confine their lusts to a computer-generated image, or to an image that used an actual child?